385 research outputs found

    Gauging risk with higher moments: handrails in measuring and optimising conditional value at risk

    The aim of the paper is to study empirically the influence of higher moments of the return distribution on conditional value at risk (CVaR). More precisely, we attempt to reveal the extent to which the risk given by CVaR can be estimated when relying on the mean, standard deviation, skewness and kurtosis. Furthermore, we study how this relationship can be utilised in portfolio optimisation. First, based on a database of 600 individual equity returns from 22 emerging world markets, factor models incorporating the first four moments of the return distribution were constructed at different confidence levels for CVaR, and the contribution of the identified factors to explaining CVaR was determined. Following this, the influence of higher moments was examined in a portfolio context, i.e. asset allocation decisions were simulated by creating emerging-market portfolios from the viewpoint of US investors. This can be regarded as the normal decision-making process of a hedge fund focusing on investments in emerging markets. In our analysis we compared and contrasted two approaches with which one can overcome the shortcomings of the variance as a risk measure. First, in the presence of conflicting higher-moment preferences, we solved a multi-objective portfolio optimisation problem for different sets of preferences. In addition, portfolio optimisation was performed in the mean-CVaR framework, characterised by using CVaR as the measure of risk. As part of the analysis, pair-wise comparisons of the different higher-moment metrics of the mean-variance and mean-CVaR efficient portfolios were also made. Throughout the work, special attention was given to the implied preferences for the different higher moments in optimising CVaR. We also examined the extent to which model risk, namely the risk of wrongly assuming normally distributed returns, can degrade our optimal portfolio choice. JEL Classification: G11, G15, C6
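    The paper's first exercise, relating CVaR to the first four moments through a factor model, can be made concrete with a small sketch. The asset universe, distributional parameters, and the linear specification below are illustrative assumptions of ours (the paper uses 600 emerging-market equities and its own factor construction), shown only to convey the moment-to-CVaR relationship:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def empirical_cvar(returns, alpha=0.95):
    """Mean loss over the worst (1 - alpha) fraction of returns."""
    losses = np.sort(-np.asarray(returns))
    k = int(np.ceil((1.0 - alpha) * len(losses)))
    return losses[-k:].mean()

# Cross-section of hypothetical assets with varying scale and skewness
n_assets, n_obs = 200, 1000
X_rows, cvars = [], []
for _ in range(n_assets):
    r = stats.skewnorm.rvs(rng.uniform(-5, 5),        # shape (skew) parameter
                           loc=rng.normal(0, 0.001),
                           scale=rng.uniform(0.01, 0.03),
                           size=n_obs, random_state=rng)
    X_rows.append([1.0, r.mean(), r.std(ddof=1),
                   stats.skew(r), stats.kurtosis(r)])  # excess kurtosis
    cvars.append(empirical_cvar(r, alpha=0.95))

# Linear four-moment "factor model" for CVaR, fitted by least squares
X, y = np.array(X_rows), np.array(cvars)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1.0 - np.var(y - X @ beta) / np.var(y)
print(f"R^2 of the four-moment model for 95% CVaR: {r2:.3f}")
```

    In this toy cross-section the four moments explain most of the variation in 95% CVaR, mirroring the kind of explanatory power the paper investigates at different confidence levels.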

    Doctor of Philosophy

    Dataflow pipeline models are widely used in visualization systems. Despite recent advances in parallel architecture, most systems still support only a single CPU or a small collection of CPUs, such as an SMP workstation. Even systems that are specifically tuned for parallel visualization provide execution models that support only data-parallelism, ignoring task-parallelism and pipeline-parallelism. With the recent popularization of machines equipped with multicore CPUs and multi-GPU units, these visualization systems are undoubtedly falling further behind in reaching maximum efficiency. On the other hand, several libraries exist that can schedule program executions on multiple CPUs and/or multiple GPUs. However, because of differences between executing a task graph and executing a pipeline, and because their APIs are considerably low-level, it remains a challenge to integrate these run-time libraries into current visualization systems. Thus, there is a need for a redesigned dataflow architecture that fully supports and exploits the power of highly parallel machines in large-scale visualization. The new design must be able to schedule executions on heterogeneous platforms while supporting arbitrarily large datasets through the use of streaming data structures. The primary goal of this dissertation is to develop a parallel dataflow architecture for streaming large-scale visualizations. The framework includes support for platforms ranging from multicore processors to clusters consisting of thousands of CPUs and GPUs. We achieve this by introducing the notion of Virtual Processing Elements and Task-Oriented Modules, along with a highly customizable scheduler that dynamically controls the assignment of tasks to elements. This creates an intuitive way to maintain multiple CPU/GPU kernels while still providing coherency and synchronization across module executions. We have implemented these techniques in HyperFlow, which consists of an API with all the basic dataflow constructs described in the dissertation, and a distributed run-time library that can be used to deploy those pipelines on multicore, multi-GPU and cluster-based platforms.
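    The dynamic assignment of pipeline tasks to processing elements described above can be illustrated with a small scheduler sketch. The greedy earliest-free-element policy and the toy four-stage pipeline are our own assumptions for illustration, not HyperFlow's actual API or scheduling strategy:

```python
import heapq
from collections import defaultdict, deque

def schedule(tasks, deps, n_elements):
    """tasks: {name: cost}; deps: {name: [prerequisite names]}.
    Greedily run each ready task on the element that frees up first;
    returns the finish time of every task."""
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    children = defaultdict(list)
    for t, prereqs in deps.items():
        for p in prereqs:
            children[p].append(t)
    ready = deque(t for t in tasks if indeg[t] == 0)
    free_at = [(0.0, e) for e in range(n_elements)]  # (time free, element id)
    heapq.heapify(free_at)
    finish = {}
    while ready:
        t = ready.popleft()
        time_free, elem = heapq.heappop(free_at)
        # A task starts when its element is free AND its inputs are done
        start = max(time_free,
                    max((finish[p] for p in deps.get(t, [])), default=0.0))
        finish[t] = start + tasks[t]
        heapq.heappush(free_at, (finish[t], elem))
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    return finish

# Toy visualization pipeline: read -> {filter, iso} -> render
pipeline = {"read": 2, "filter": 1, "iso": 3, "render": 1}
deps = {"filter": ["read"], "iso": ["read"], "render": ["filter", "iso"]}
print(schedule(pipeline, deps, n_elements=2))
```

    With two elements, `filter` and `iso` run concurrently after `read`, and `render` waits for the slower branch; a real scheduler of this kind would also weigh data locality and CPU/GPU kernel placement.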

    Dynamic Asset Allocation with Regime Shifts and Long Horizon CVaR-Constraints

    We analyse portfolio policies for investors who invest optimally over given investment horizons subject to Conditional Value-at-Risk constraints. We account for non-normally distributed, skewed, and leptokurtic asset return distributions due to regime shifts. The focus is on standard CRRA utility with a money-back guarantee at maturity, which is often attached to individual retirement plans. Optimal solutions for both the unconstrained and the constrained policy are provided and examined for risk-management costs, calculated as welfare losses. Our results confirm previous findings that money-back guarantees yield mild downside protection at low economic cost for most long-term investors.
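    The setup can be sketched with a minimal Monte Carlo exercise: terminal wealth under a hypothetical two-regime monthly return model, with the CVaR of the shortfall below a money-back guarantee as the constrained quantity. All parameters (regime means, volatilities, transition matrix, horizon) are illustrative assumptions of ours, not the paper's calibration or its optimal policy:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_terminal_wealth(w_risky, horizon_months=120, n_paths=20_000):
    """Terminal wealth of a constant-mix portfolio of a regime-switching
    risky asset and cash, rebalanced monthly."""
    mu = np.array([0.008, -0.010])    # mean return: calm vs. crisis regime
    sigma = np.array([0.030, 0.080])  # volatility: calm vs. crisis regime
    P = np.array([[0.97, 0.03],       # monthly regime transition matrix
                  [0.10, 0.90]])
    rf = 0.002                        # monthly risk-free rate
    state = np.zeros(n_paths, dtype=int)  # all paths start in the calm regime
    wealth = np.ones(n_paths)
    for _ in range(horizon_months):
        r_risky = rng.normal(mu[state], sigma[state])
        wealth *= 1.0 + w_risky * r_risky + (1.0 - w_risky) * rf
        state = (rng.random(n_paths) < P[state, 1]).astype(int)
    return wealth

def cvar_shortfall(wealth, guarantee=1.0, alpha=0.95):
    """CVaR at level alpha of the shortfall below the guarantee:
    mean of the worst (1 - alpha) fraction of shortfalls."""
    shortfall = np.maximum(guarantee - wealth, 0.0)
    k = int(np.ceil((1.0 - alpha) * len(shortfall)))
    return np.sort(shortfall)[-k:].mean()

for w in (0.2, 0.5, 0.8):
    print(f"risky weight {w:.1f}: CVaR of guarantee shortfall = "
          f"{cvar_shortfall(simulate_terminal_wealth(w)):.4f}")
```

    A CVaR constraint of this kind caps the admissible risky weight: as the allocation to the regime-switching asset rises, the tail shortfall below the guarantee grows, which is the trade-off the paper prices as a welfare loss.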